Berry–Esseen bounds for Chernoff-type nonstandard asymptotics in isotonic regression

Authors

Abstract

A Chernoff-type distribution is a nonnormal distribution defined by the slope at zero of the greatest convex minorant of a two-sided Brownian motion with a polynomial drift. While it is known to appear as the distributional limit in many nonregular statistical estimation problems, the accuracy of Chernoff-type approximations has remained largely unknown. In the present paper, we tackle this problem and derive Berry–Esseen bounds for Chernoff-type limit distributions in the canonical isotonic (or monotone) regression model. The derived bounds match those of the oracle local average estimator with the optimal bandwidth in each scenario of possibly different Chernoff-type asymptotics, up to multiplicative logarithmic factors. Our method of proof differs from standard techniques for Berry–Esseen bounds, and relies on new localization techniques in isotonic regression and an anti-concentration inequality for the supremum of a Brownian motion with a Lipschitz drift.
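To make the limiting object concrete, here is a minimal Monte Carlo sketch (not part of the paper) of the slope-at-zero functional: it simulates a two-sided Brownian motion on a fine grid, adds a quadratic drift t² as the canonical polynomial case, takes the greatest convex minorant as a lower convex hull, and reads off the slope of the hull segment straddling zero. The horizon T, step size dt, and the quadratic choice of drift are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def chernoff_slope_sample(T=3.0, dt=1e-3):
    """One draw of the slope at zero of the greatest convex minorant (GCM)
    of W(t) + t^2 on [-T, T], with W a two-sided Brownian motion."""
    n = int(T / dt)
    t = np.arange(-n, n + 1) * dt
    # two-sided BM: independent increments to the left and right of 0
    right = np.cumsum(rng.normal(0.0, np.sqrt(dt), n))
    left = np.cumsum(rng.normal(0.0, np.sqrt(dt), n))
    x = np.concatenate([left[::-1], [0.0], right]) + t**2
    # lower convex hull of the points (t_i, x_i) = the discretized GCM
    hull = [0]
    for i in range(1, t.size):
        while len(hull) >= 2:
            a, b = hull[-2], hull[-1]
            # pop b if it lies on or above the chord from a to i
            if (t[b] - t[a]) * (x[i] - x[a]) <= (x[b] - x[a]) * (t[i] - t[a]):
                hull.pop()
            else:
                break
        hull.append(i)
    th, xh = t[hull], x[hull]
    # hull segment whose interval contains 0 (right slope if 0 is a kink)
    j = np.searchsorted(th, 0.0, side="right") - 1
    return (xh[j + 1] - xh[j]) / (th[j + 1] - th[j])

draws = np.array([chernoff_slope_sample() for _ in range(200)])
print(draws.mean(), draws.std())   # empirical location/scale of the draws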

Similar articles

Risk Bounds in Isotonic Regression

Nonasymptotic risk bounds are provided for maximum likelihood-type isotonic estimators of an unknown nondecreasing regression function, with general average loss at design points. These bounds are optimal up to scale constants, and they imply uniform n−1/3-consistency of the ℓp risk for unknown regression functions of uniformly bounded variation, under mild assumptions on the joint probability d...
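For background on the estimator this entry studies, the following is a minimal sketch of the pool adjacent violators algorithm (PAVA), which computes the isotonic least squares fit; the toy data in the last lines are an arbitrary illustration.

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: least-squares projection of y onto the
    cone of nondecreasing sequences (the isotonic LSE)."""
    means, sizes = [], []   # pooled block means and block sizes
    for yi in np.asarray(y, dtype=float):
        means.append(yi); sizes.append(1)
        # merge blocks from the right while monotonicity is violated
        while len(means) > 1 and means[-2] > means[-1]:
            m2, w2 = means.pop(), sizes.pop()
            m1, w1 = means.pop(), sizes.pop()
            means.append((w1 * m1 + w2 * m2) / (w1 + w2))
            sizes.append(w1 + w2)
    return np.repeat(means, sizes)

# illustrative usage: noisy observations of a nondecreasing function
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 300)
y = x**2 + rng.normal(0.0, 0.1, x.size)
fit = pava(y)               # nondecreasing step function fitted to y
```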

Improved Risk Bounds in Isotonic Regression

We consider the problem of estimating an unknown non-decreasing sequence θ from finitely many noisy observations. We give an improved global risk upper bound for the isotonic least squares estimator (LSE) in this problem. The obtained risk bound behaves differently depending on the form of the true sequence θ – one gets a whole range of rates from log n/n (when θ is constant) to n−2/3 (when θ i...
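A quick way to see the two regimes described here is a small Monte Carlo on the global risk of the isotonic LSE. This sketch uses sklearn's isotonic_regression for the fit; the sample sizes, noise level, and replication count are arbitrary choices.

```python
import numpy as np
from sklearn.isotonic import isotonic_regression

rng = np.random.default_rng(2)

def mc_risk(theta, reps=200, sigma=1.0):
    """Monte Carlo estimate of the global risk E[(1/n)||theta_hat - theta||^2]
    of the isotonic LSE at the true sequence theta."""
    return float(np.mean([
        np.mean((isotonic_regression(theta + rng.normal(0.0, sigma, theta.size))
                 - theta) ** 2)
        for _ in range(reps)
    ]))

for n in (100, 400, 1600):
    flat = np.zeros(n)                  # constant truth: near-parametric rate
    rising = np.linspace(0.0, 1.0, n)   # strictly increasing truth: ~ n^(-2/3)
    print(f"n={n:5d}  constant: {mc_risk(flat):.5f}  increasing: {mc_risk(rising):.5f}")
```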

Chernoff Bounds

If m = 2, i.e., P = (p, 1 − p) and Q = (q, 1 − q), we also write DKL(p‖q). The Kullback-Leibler divergence provides a measure of distance between the distributions P and Q: it represents the expected loss of efficiency if we encode an m-letter alphabet with distribution P with a code that is optimal for distribution Q. We can now state the general form of the Chernoff Bound: Theorem 1.1. Let X1...
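The theorem statement is cut off in this excerpt; to make the KL notation concrete, here is a sketch of one standard form of the Chernoff bound for Bernoulli sums, P(S ≥ qn) ≤ exp(−n·DKL(q‖p)) for q > p, checked against the exact binomial tail (the constants n, p, q are arbitrary).

```python
import numpy as np
from scipy.stats import binom

def dkl_bernoulli(q, p):
    """Kullback-Leibler divergence DKL(q||p) of Bernoulli(q) from Bernoulli(p)."""
    return q * np.log(q / p) + (1.0 - q) * np.log((1.0 - q) / (1.0 - p))

n, p, q = 1000, 0.5, 0.6
bound = np.exp(-n * dkl_bernoulli(q, p))         # Chernoff upper bound on P(S >= qn)
exact = binom.sf(int(np.ceil(q * n)) - 1, n, p)  # exact tail; sf(k) = P(S > k)
print(f"Chernoff bound: {bound:.3e}   exact tail: {exact:.3e}")
```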

Variational Chernoff Bounds for Graphical Models

Recent research has made significant progress on the problem of bounding log partition functions for exponential family graphical models. Such bounds have associated dual parameters that are often used as heuristic estimates of the marginal probabilities required in inference and learning. However these variational estimates do not give rigorous bounds on marginal probabilities, nor do they giv...
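As context for the quantity being bounded here, the following toy sketch evaluates the log partition function of a hypothetical 3-node Ising model by brute-force enumeration (the couplings and fields are arbitrary); variational methods target bounds on exactly this quantity when enumeration over the 2^n states is infeasible.

```python
import itertools
import numpy as np

# hypothetical toy model: 3 spins, arbitrary pairwise couplings J and fields h
J = {(0, 1): 0.5, (1, 2): -0.3, (0, 2): 0.2}
h = np.array([0.1, -0.2, 0.4])

def log_partition(J, h):
    """log Z = log sum_s exp(h.s + sum_ij J_ij s_i s_j) over s in {-1,+1}^n."""
    energies = []
    for spins in itertools.product((-1, 1), repeat=h.size):
        s = np.array(spins)
        energies.append(h @ s + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items()))
    m = max(energies)                    # log-sum-exp for numerical stability
    return m + np.log(sum(np.exp(e - m) for e in energies))

print(f"log Z = {log_partition(J, h):.4f}")
```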

Journal

Journal title: Annals of Applied Probability

Year: 2022

ISSN: 1050-5164, 2168-8737

DOI: https://doi.org/10.1214/21-aap1716